Optimistic parallelism requires abstractions
Authors
Abstract
Similar references
Optimistic Parallelism on GPUs
We present speculative parallelization techniques that can exploit parallelism in loops even in the presence of dynamic irregularities that may give rise to cross-iteration dependences. The execution of a speculatively parallelized loop consists of five phases: scheduling, computation, misspeculation check, result committing, and misspeculation recovery. While the first two phases enable exploi...
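The five phases can be pictured with a small sketch. The C++ below is a minimal, illustrative rendering rather than the system described in this paper: it assumes a loop of the form a[i] = f(a[idx[i]]), a conservative dependence test (idx[i] < i), and a fixed thread count, all of which are assumptions made for the example.

#include <cstdio>
#include <thread>
#include <vector>

// Illustrative loop body; any pure function of the read value would do.
static double f(double x) { return 0.5 * x + 1.0; }

int main() {
    const int n = 16;
    const int nthreads = 4;
    std::vector<int> idx(n);
    std::vector<double> a(n), result(n);
    for (int i = 0; i < n; ++i) { a[i] = i; idx[i] = (7 * i) % n; }

    // Phase 1: scheduling -- iterations are statically assigned to threads.
    // Phase 2: computation -- each iteration speculatively reads the old
    // array and writes its result into a private slot (no writes to a[]).
    std::vector<std::thread> workers;
    for (int t = 0; t < nthreads; ++t)
        workers.emplace_back([&, t] {
            for (int i = t; i < n; i += nthreads)
                result[i] = f(a[idx[i]]);
        });
    for (auto& w : workers) w.join();

    // Phase 3: misspeculation check -- iteration i read a stale value if an
    // earlier iteration of this batch (idx[i] < i) writes the element it read.
    std::vector<char> misspeculated(n, 0);
    for (int i = 0; i < n; ++i)
        if (idx[i] < i) misspeculated[i] = 1;

    // Phase 4: result committing -- correctly speculated iterations become
    // visible in original iteration order.
    for (int i = 0; i < n; ++i)
        if (!misspeculated[i]) a[i] = result[i];

    // Phase 5: misspeculation recovery -- re-execute failed iterations
    // sequentially, now observing the committed values they depend on.
    for (int i = 0; i < n; ++i)
        if (misspeculated[i]) a[i] = f(a[idx[i]]);

    for (int i = 0; i < n; ++i) std::printf("a[%d] = %g\n", i, a[i]);
    return 0;
}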
Abstractions for Adaptive Data Parallelism
This paper describes how a class of data parallel programs (SPMD) may be expressed using reusable adaptive abstractions. The abstractions support adaptive use of a network of workstations for parallel computing. Although data parallelism is the paradigm considered, the programs are expressed using message passing. The main contribution of this paper is to demonstrate how adaptive parallelism may...
Optimistic Intra-Transaction Parallelism on Chip Multiprocessors
With the advent of chip multiprocessors, exploiting intra-transaction parallelism is an attractive way of improving transaction performance. However, exploiting intra-transaction parallelism in existing database systems is difficult, for two reasons: first, significant changes are required to avoid races or conflicts within the DBMS, and second, adding threads to transactions requires a high le...
High-Level Abstractions for Safe Parallelism
Recent research efforts have developed sophisticated type systems for eliminating unwanted interference (i.e., read-write conflicts) from parallel code. While these systems are powerful, they suffer from potential barriers to adoption in that (1) they rely upon complex and/or restrictive features that may be difficult for programmers to understand and use; and (2) they impose a nontrivial annot...
Object Template Abstractions for Light-Weight Data-Parallelism
Data-parallelism is a widely used model for parallel programming. Control structures like parallel DO loops, and data structures like collections, have been used to express data-parallelism. In typical implementations, these constructs are 'flat' in the sense that only one data-parallel operation is active at any time. To model applications that can exploit overlap of synchronization and computat...
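As a rough illustration of a flat data-parallel operation (not the object-template mechanism this paper proposes), the C++ sketch below applies a single element-wise operation over a collection using the standard parallel algorithms; the container and the doubling operation are assumptions made for the example.

#include <algorithm>
#include <cstdio>
#include <execution>
#include <numeric>
#include <vector>

int main() {
    std::vector<int> data(8);
    std::iota(data.begin(), data.end(), 1);  // 1, 2, ..., 8

    // One "flat" data-parallel operation: a single parallel loop over the
    // collection is active; no second data-parallel operation runs inside it.
    std::for_each(std::execution::par, data.begin(), data.end(),
                  [](int& x) { x *= 2; });

    for (int x : data) std::printf("%d ", x);
    std::printf("\n");
    return 0;
}

Nested or overlapping data-parallel operations, which this paper targets, would instead require launching further data-parallel work from inside that loop body.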
Journal
Journal title: Communications of the ACM
Year: 2009
ISSN: 0001-0782, 1557-7317
DOI: 10.1145/1562164.1562188